
    Mesoscopic theory for inhomogeneous mixtures

    Mesoscopic density functional theory for inhomogeneous mixtures of spherical particles is developed in terms of mesoscopic volume fractions by a systematic coarse-graining procedure starting from microscopic theory. Approximate expressions for the correlation functions and for the grand potential are obtained for weak ordering on mesoscopic length scales. Stability analysis of the disordered phase is performed in the mean-field approximation (MF) and beyond. MF shows the existence of either a spinodal or a λ-surface on the volume-fraction-temperature phase diagram. Separation into homogeneous phases or formation of an inhomogeneous distribution of particles occurs on the low-temperature side of the former or the latter surface, respectively, depending on both the interaction potentials and the size ratios between particles of different species. Beyond MF the spinodal surface is shifted, and the instability at the λ-surface is suppressed by fluctuations. We interpret the λ-surface as a borderline between homogeneous and inhomogeneous (containing clusters or other aggregates) structure of the disordered phase. For two-component systems explicit expressions for the MF spinodal and λ-surfaces are derived. Examples of interaction potentials of simple form are analyzed in some detail, in order to identify conditions leading to inhomogeneous structures.
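    The MF distinction between the two instability surfaces can be stated compactly (generic stability notation, mine rather than the paper's): the disordered phase is stable while the Fourier-transformed second-variation (inverse correlation) matrix is positive definite, and instability first sets in where

\[
\det \tilde C^{-1}_{ij}(k^*) = 0 ,
\]

    with \(k^* = 0\) on the spinodal surface (macroscopic phase separation) and \(k^* = k_b > 0\) on the λ-surface (inhomogeneities with a characteristic mesoscopic length \(2\pi/k_b\)).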

    Irregular Persistent Activity Induced by Synaptic Excitatory Feedback

    Neurophysiological experiments on monkeys have reported highly irregular persistent activity during the performance of an oculomotor delayed-response task. These experiments show that during the delay period the coefficient of variation (CV) of interspike intervals (ISI) of prefrontal neurons is above 1, on average, and larger than during the fixation period. In the present paper, we show that this feature can be reproduced in a network in which persistent activity is induced by excitatory feedback, provided that (i) the post-spike reset is close enough to threshold, and (ii) synaptic efficacies are a non-linear function of the pre-synaptic firing rate. The non-linearity between pre-synaptic rate and effective synaptic strength is implemented by a standard short-term depression (STD) mechanism. First, we consider the simplest possible network with excitatory feedback: a fully connected homogeneous network of excitatory leaky integrate-and-fire neurons, using both numerical simulations and analytical techniques. The results are then confirmed in a network with selective excitatory neurons and inhibition. In both cases there is a large range of values of the synaptic efficacies for which the statistics of firing of single cells is similar to the experimental data.
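    The CV statistic used above is straightforward to compute from a spike train; a minimal sketch (CV ≈ 1 for a Poisson process, CV > 1 for firing more irregular than Poisson, as reported for prefrontal neurons during the delay):

```python
import numpy as np

def isi_cv(spike_times):
    """Coefficient of variation of the interspike intervals (ISI):
    std(ISI) / mean(ISI).  CV ~ 1 for Poisson firing; CV > 1 means
    firing more irregular than a Poisson process."""
    isis = np.diff(np.sort(np.asarray(spike_times, dtype=float)))
    return isis.std() / isis.mean()

# Perfectly regular train: all ISIs equal, so CV is (numerically) zero.
regular = np.arange(0.0, 1.0, 0.01)

# Poisson-like train: exponential ISIs, so CV is close to 1.
rng = np.random.default_rng(0)
poisson = np.cumsum(rng.exponential(0.01, size=10_000))
```

    The same function applied separately to fixation-period and delay-period spikes would reproduce the comparison described in the abstract.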

    Admit your weakness: Verifying correctness on TSO architectures

    The final publication is available at http://link.springer.com/chapter/10.1007%2F978-3-319-15317-9_22. Linearizability has become the standard correctness criterion for fine-grained non-atomic concurrent algorithms; however, most approaches assume a sequentially consistent memory model, which is not always realised in practice. In this paper we study the correctness of concurrent algorithms on a weak memory model: the TSO (Total Store Order) memory model, which is commonly implemented by multicore architectures. Here, linearizability is often too strict, and hence we prove a weaker criterion, quiescent consistency, instead. Like linearizability, quiescent consistency is compositional, making it an ideal correctness criterion in a component-based context. We demonstrate how to model a typical concurrent algorithm, seqlock, and prove it quiescent consistent using a simulation-based approach. Previous approaches to proving correctness on TSO architectures have been based on linearizability, which makes it necessary to modify the algorithm's high-level requirements. Our approach is the first, to our knowledge, to prove correctness without the need for such a modification.
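    For readers unfamiliar with the case study, the seqlock counter discipline can be sketched as follows. This is only an illustration of the protocol's shape, not the paper's formal model: Python's runtime does not exhibit TSO store-buffer reordering, which is precisely the behaviour the paper analyses.

```python
import threading

class SeqLock:
    """Sketch of a seqlock protecting a pair (x, y).

    The writer bumps `seq` to an odd value before mutating the data and
    to the next even value afterwards; a reader retries until it sees
    the same even sequence number before and after reading."""

    def __init__(self, x=0, y=0):
        self.seq = 0
        self.x, self.y = x, y
        self._wlock = threading.Lock()  # serialise concurrent writers

    def write(self, x, y):
        with self._wlock:
            self.seq += 1           # odd: write in progress
            self.x, self.y = x, y
            self.seq += 1           # even: write complete

    def read(self):
        while True:
            s1 = self.seq
            if s1 % 2:              # writer active: retry
                continue
            x, y = self.x, self.y
            if self.seq == s1:      # no write interleaved the read
                return x, y
```

    On TSO hardware the writer's two counter updates may drain from the store buffer after the data writes become visible, which is why the sequentially consistent intuition above needs the weaker quiescent-consistency argument developed in the paper.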

    Working Memory Cells' Behavior May Be Explained by Cross-Regional Networks with Synaptic Facilitation

    Neurons in the cortex exhibit a number of activity patterns that correlate with working memory. Specifically, averaged across trials of working memory tasks, neurons exhibit different firing rate patterns during the delay of those tasks. These patterns include: 1) persistent fixed-frequency elevated rates above baseline, 2) elevated rates that decay throughout the task's memory period, 3) rates that accelerate throughout the delay, and 4) patterns of inhibited firing (below baseline) analogous to each of the preceding excitatory patterns. Persistent elevated rate patterns are believed to be the neural correlate of working memory retention and preparation for execution of behavioral/motor responses as required in working memory tasks. Models have proposed that such activity corresponds to stable attractors in cortical neural networks with fixed synaptic weights. However, the variability in patterned behavior and the firing statistics of real neurons, across the entire range of those behaviors and across and within trials of working memory tasks, are typically not reproduced. Here we examine the effect of dynamic synapses and network architectures with multiple cortical areas on the states and dynamics of working memory networks. The analysis indicates that the multiple pattern types exhibited by cells in working memory networks are inherent in networks with dynamic synapses, and that the variability and firing statistics in such networks with distributed architectures agree with those observed in the cortex.
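    A standard way to model the dynamic synapses referred to above is the Tsodyks-Markram short-term plasticity scheme; the sketch below follows that textbook form (parameter values and the explicit Euler step are my illustrative choices, not the paper's equations):

```python
def dynamic_synapse(spikes, dt=1e-3, U=0.2, tau_rec=0.5, tau_fac=1.0):
    """Tsodyks-Markram-style dynamic synapse.

    x: fraction of available resources (depression; recovers with tau_rec)
    u: utilisation of resources (facilitation; relaxes to U with tau_fac)
    A presynaptic spike transmits with efficacy u * x, facilitates u,
    and depletes x.  `spikes` is a boolean spike train sampled at dt."""
    x, u = 1.0, U
    efficacies = []
    for s in spikes:
        x += dt * (1.0 - x) / tau_rec   # resource recovery
        u += dt * (U - u) / tau_fac     # facilitation decay
        if s:                           # presynaptic spike
            u += U * (1.0 - u)          # facilitate
            efficacies.append(u * x)    # transmitted efficacy
            x -= u * x                  # deplete resources
    return efficacies
```

    Driving this synapse with a sustained high-rate train makes the transmitted efficacy decay from its initial value toward a depressed steady state, which is the rate-dependent non-linearity the models above exploit.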

    Binary Willshaw learning yields high synaptic capacity for long-term familiarity memory

    We investigate from a computational perspective the efficiency of the Willshaw synaptic update rule in the context of familiarity discrimination, a binary-answer, memory-related task that has been linked through psychophysical experiments with modified neural activity patterns in the prefrontal and perirhinal cortex regions. Our motivation for recovering this well-known learning prescription is two-fold: first, the switch-like nature of the induced synaptic bonds, as there is evidence that biological synaptic transitions might occur in a discrete, stepwise fashion; second, the possibility that in the mammalian brain unused, silent synapses might be pruned in the long term. Besides the usual pattern and network capacities, we calculate the synaptic capacity of the model, a recently proposed measure in which only the functional subset of synapses is taken into account. We find that in terms of network capacity, Willshaw learning is strongly affected by the pattern coding rates, which have to be kept fixed and very low at all times to achieve a non-zero capacity in the large-network limit. The information carried per functional synapse, however, diverges and is comparable to that of the pattern association case, even for more realistic, moderately low activity levels that are a function of network size.
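    The binary Willshaw rule itself is a one-line clipped outer product. The sketch below shows the rule together with a deliberately simple familiarity criterion (the all-pairs-wired test is my illustrative choice; the paper's discrimination procedure and capacity analysis are more refined):

```python
import numpy as np

def willshaw_store(patterns):
    """Binary Willshaw rule: a synapse switches on, irreversibly,
    whenever its pre- and post-synaptic units are co-active in a
    stored pattern."""
    patterns = np.asarray(patterns, dtype=bool)
    n = patterns.shape[1]
    W = np.zeros((n, n), dtype=bool)
    for p in patterns:
        W |= np.outer(p, p)
    return W

def familiar(W, probe):
    """Toy familiarity test: a probe with k active units is judged
    familiar iff every pair of its active units is already wired,
    i.e. the dendritic sum attains its maximum value k * k."""
    probe = np.asarray(probe, dtype=int)
    k = probe.sum()
    return int(probe @ W.astype(int) @ probe) == k * k

# Toy demo: four stored patterns with disjoint groups of 5 active units.
patterns = np.zeros((4, 20), dtype=int)
for i in range(4):
    patterns[i, 5 * i:5 * i + 5] = 1
W = willshaw_store(patterns)
```

    Because the rule only ever switches synapses on, the fraction of "on" synapses grows with the number and density of stored patterns, which is why sparse coding is essential for the capacities discussed above.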

    Mean-field cooperativity in chemical kinetics

    We consider cooperative reactions, and we study the effects of the interaction strength among the system components on the reaction rate, hence realizing a connection between microscopic and macroscopic observables. Our approach is based on statistical mechanics models and is developed analytically via mean-field techniques. First of all, we show that, when the coupling strength is set positive, the model is able to consistently recover all the various cooperativity measures previously introduced, hence obtaining a single unifying framework. Furthermore, we introduce a criterion to discriminate between weak and strong cooperativity, based on a measure of "susceptibility". We also extend the model to account for multiple-attachment phenomena: this is realized by incorporating p-body interactions, whose non-trivial cooperative capability is investigated too.
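    The flavour of such a mean-field treatment can be shown in a few lines: the fraction of occupied binding sites solves a self-consistency equation, and a susceptibility-style measure (the peak slope of the binding curve) grows with the coupling. The specific equation and measure below are generic mean-field sketches, not the paper's model:

```python
import numpy as np

def occupancy(h, J, beta=1.0, iters=200):
    """Mean-field self-consistency for the fraction of bound sites:
        theta = 1 / (1 + exp(-beta * (h + J * theta)))
    where h plays the role of a chemical potential (log substrate level)
    and J is the coupling among binding sites.  Solved by fixed-point
    iteration (a contraction for the moderate J used here)."""
    theta = 0.5
    for _ in range(iters):
        theta = 1.0 / (1.0 + np.exp(-beta * (h + J * theta)))
    return theta

def max_susceptibility(J, beta=1.0):
    """Peak of d(theta)/dh over a grid of h values: a susceptibility-style
    cooperativity measure.  For J = 0 the peak slope is beta/4; positive
    coupling steepens the binding curve and raises the peak."""
    hs = np.linspace(-6.0, 6.0, 601)
    th = np.array([occupancy(h, J, beta) for h in hs])
    return float(np.max(np.gradient(th, hs)))
```

    Comparing the peak slope at zero and positive coupling gives a direct, quantitative reading of how interactions sharpen the reaction curve, which is the kind of weak-versus-strong discrimination the susceptibility criterion formalizes.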

    d=3 Bosonic Vector Models Coupled to Chern-Simons Gauge Theories

    We study three-dimensional O(N)_k and U(N)_k Chern-Simons theories coupled to a scalar field in the fundamental representation, in the large N limit. For infinite k this is just the singlet sector of the O(N) (U(N)) vector model, which is conjectured to be dual to Vasiliev's higher spin gravity theory on AdS_4. For large k and N we obtain a parity-breaking deformation of this theory, controlled by the 't Hooft coupling lambda = 4 \pi N / k. For infinite N we argue (and show explicitly at two-loop order) that the theories with finite lambda are conformally invariant, and also have an exactly marginal (\phi^2)^3 deformation. For large but finite N and small 't Hooft coupling lambda, we show that there is still a line of fixed points parameterized by lambda. We show that, at infinite N, the interacting non-parity-invariant theory with finite lambda has the same spectrum of primary operators as the free theory, consisting of an infinite tower of conserved higher-spin currents and a scalar operator with scaling dimension \Delta = 1; however, the correlation functions of these operators do depend on lambda. Our results suggest that there should exist a family of higher spin gravity theories, parameterized by lambda, and continuously connected to Vasiliev's theory. For finite N the higher-spin currents are not conserved.
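    For orientation, the limit described above is the 't Hooft limit

\[
N \to \infty, \qquad k \to \infty, \qquad \lambda \equiv \frac{4\pi N}{k} \ \text{fixed},
\]

    and the \((\phi^2)^3\) deformation is classically marginal because in \(d = 3\) the scalar has dimension \([\phi] = \tfrac{1}{2}\), so \([(\phi^2)^3] = 3\) matches the spacetime dimension.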

    Balanced Input Allows Optimal Encoding in a Stochastic Binary Neural Network Model: An Analytical Study

    Recent neurophysiological experiments have demonstrated a remarkable effect of attention on the underlying neural activity, suggesting for the first time that information encoding is actively influenced by attention. Single-cell recordings show that attention reduces both the neural variability and the correlations in the attended condition with respect to the non-attended one. This reduction of variability and redundancy enhances the information associated with the detection and further processing of the attended stimulus. Beyond the attentional paradigm, the local activity in a neural circuit can be modulated in a number of ways, leading to the general question of understanding how the activity of such circuits is sensitive to these relatively small modulations. Here, using an analytically tractable neural network model, we demonstrate how this enhancement of information emerges when excitatory and inhibitory synaptic currents are balanced. In particular, we show that the network's encoding sensitivity, as measured by the Fisher information, is maximized at the exact balance. Furthermore, we find a similar result for a more realistic spiking neural network model. As the regime of balanced inputs has been experimentally observed, these results suggest that this regime is functionally important from an information-encoding standpoint.
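    The intuition behind "Fisher information is maximized at exact balance" already appears in a single stochastic binary unit (a toy calculation of mine, not the paper's network model): for a Bernoulli neuron with sigmoidal firing probability, the Fisher information about a small stimulus reduces to p(1 - p), which peaks when excitation and inhibition cancel.

```python
import numpy as np

def fisher_info(g_exc, g_inh, s=0.0):
    """Fisher information about a stimulus s carried by one stochastic
    binary neuron firing with probability p = sigmoid(g_exc - g_inh + s).

    For a Bernoulli variable, I(s) = p'(s)^2 / (p * (1 - p)); with a
    logistic p this simplifies to p * (1 - p), maximal at p = 1/2,
    i.e. when excitatory and inhibitory inputs cancel exactly."""
    p = 1.0 / (1.0 + np.exp(-(g_exc - g_inh + s)))
    return p * (1.0 - p)

# Sweep inhibition at fixed excitation: information peaks at g_inh == g_exc.
g_inh = np.linspace(0.0, 4.0, 401)
info = fisher_info(2.0, g_inh)
```

    In the balanced-network setting the argument is of course collective rather than single-neuron, but the sweep above illustrates why pushing the operating point away from balance in either direction degrades encoding sensitivity.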